
Conversation

@zhangtao0408

What does this PR do?

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

- Refactor FluxAttention to support an optional dual-stream calculation and integrate the mindiesd attention forward (see the sketch after this list).
- Remove the `_context_parallel_forward` method, which handled context-parallel forward passes for the attention mechanisms.
- Remove unused stream and event variables in `transformer_flux.py`.
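For context, here is a minimal sketch of what an optional dual-stream attention dispatch can look like. This is not the PR's actual code: the class, layer, and parameter names are illustrative, MindSpore is assumed as the framework, and plain softmax attention stands in for the mindiesd fused attention forward that the PR integrates.

```python
import numpy as np
import mindspore as ms
from mindspore import nn, ops


class FluxAttentionSketch(nn.Cell):
    """Joint self-attention with an optional dual-stream (text) path.

    Hypothetical sketch, not the PR's implementation. When
    `encoder_hidden_states` is given, text tokens get their own QKV
    projections and are concatenated with the image tokens before one
    joint attention call (dual-stream); otherwise the layer runs as
    plain single-stream self-attention.
    """

    def __init__(self, dim, num_heads):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        # image-stream projections
        self.to_q = nn.Dense(dim, dim)
        self.to_k = nn.Dense(dim, dim)
        self.to_v = nn.Dense(dim, dim)
        self.to_out = nn.Dense(dim, dim)
        # text-stream projections, used only on the dual-stream path
        self.add_q = nn.Dense(dim, dim)
        self.add_k = nn.Dense(dim, dim)
        self.add_v = nn.Dense(dim, dim)
        self.add_out = nn.Dense(dim, dim)

    def _heads(self, x):
        # (b, n, dim) -> (b, heads, n, head_dim)
        b, n, _ = x.shape
        x = x.reshape(b, n, self.num_heads, self.head_dim)
        return ops.transpose(x, (0, 2, 1, 3))

    def _attention(self, q, k, v):
        # Plain softmax attention as a readable stand-in; the actual PR
        # dispatches this step to mindiesd's fused attention forward.
        scores = ops.matmul(q, ops.transpose(k, (0, 1, 3, 2))) * self.scale
        out = ops.matmul(ops.softmax(scores, axis=-1), v)
        b, h, n, d = out.shape
        return ops.transpose(out, (0, 2, 1, 3)).reshape(b, n, h * d)

    def construct(self, hidden_states, encoder_hidden_states=None):
        q = self._heads(self.to_q(hidden_states))
        k = self._heads(self.to_k(hidden_states))
        v = self._heads(self.to_v(hidden_states))

        if encoder_hidden_states is None:
            # single-stream path: self-attention over image tokens only
            return self.to_out(self._attention(q, k, v))

        # dual-stream path: project text tokens separately, then attend
        # jointly over the concatenated (text + image) sequence
        txt_len = encoder_hidden_states.shape[1]
        q = ops.cat((self._heads(self.add_q(encoder_hidden_states)), q), axis=2)
        k = ops.cat((self._heads(self.add_k(encoder_hidden_states)), k), axis=2)
        v = ops.cat((self._heads(self.add_v(encoder_hidden_states)), v), axis=2)
        out = self._attention(q, k, v)
        # split the joint output back into its two streams
        return self.to_out(out[:, txt_len:]), self.add_out(out[:, :txt_len])


# Smoke test with random inputs.
attn = FluxAttentionSketch(dim=64, num_heads=4)
img = ms.Tensor(np.random.randn(1, 16, 64), ms.float32)
txt = ms.Tensor(np.random.randn(1, 8, 64), ms.float32)
print(attn(img).shape)                    # single-stream: (1, 16, 64)
print([t.shape for t in attn(img, txt)])  # dual-stream: two outputs
```

Keeping both paths in one `construct` lets dual-stream and single-stream blocks share the same attention module, with the text-stream projections touched only when `encoder_hidden_states` is passed in.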
